Mutual Information, Relative Entropy, and Estimation in the Poisson Channel


Similar Articles

Estimation of Entropy and Mutual Information

We present some new results on the nonparametric estimation of entropy and mutual information. First, we use an exact local expansion of the entropy function to prove almost sure consistency and central limit theorems for three of the most commonly used discretized information estimators. The setup is related to Grenander’s method of sieves and places no assumptions on the underlying probabilit...
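For reference, the basic "plug-in" (maximum-likelihood) discretized estimator that work of this kind analyzes and refines can be sketched in a few lines of stdlib Python. This is a generic sketch, not the paper's exact construction; the bin count and the Gaussian sample are illustrative choices:

```python
import math
import random
from collections import Counter

def plugin_entropy(samples, num_bins):
    """Plug-in (maximum-likelihood) entropy estimate, in nats, of the
    samples discretized into num_bins equal-width bins."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / num_bins or 1.0   # guard against a constant sample
    counts = Counter(min(int((s - lo) / width), num_bins - 1) for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(10000)]
print(plugin_entropy(x, 30))  # never exceeds log(30), the uniform maximum
```

The estimator is biased downward for small samples, which is exactly the finite-sample behavior that consistency and central-limit results for such estimators characterize.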


Information Theory: Data Compression, Channel Coding, Entropy, and Mutual Information

This report gives insight into two major aspects of information theory: data compression and channel coding. It discusses simulation results obtained using MATLAB from the information-theory viewpoint. It also defines these aspects using entropy and mutual information, which reflect the optimum data compression that could be obtained and the ultimate transmi...
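As a concrete illustration of how entropy bounds lossless compression, the following stdlib-Python sketch compares the Shannon entropy of a pmf with the average codeword length of a binary Huffman code. The pmf is a made-up dyadic example, chosen because on dyadic distributions Huffman coding meets the entropy bound exactly:

```python
import heapq
import math

def entropy_bits(p):
    """Shannon entropy in bits of a discrete pmf."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code, computed as the
    sum of merged probabilities over all internal-node merges."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

p = [0.5, 0.25, 0.125, 0.125]          # dyadic pmf, chosen for illustration
print(entropy_bits(p), huffman_avg_length(p))  # both are exactly 1.75 bits
```

For non-dyadic distributions the Huffman average length exceeds the entropy by less than one bit, which is the classical source-coding gap.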


Geometric k-nearest neighbor estimation of entropy and mutual information

Nonparametric estimation of mutual information is used in a wide range of scientific problems to quantify dependence between variables. The k-nearest neighbor (knn) methods are consistent, and therefore expected to work well for a large sample size. These methods use geometrically regular local volume elements. This practice allows maximum localization of the volume elements, but can also induc...
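A minimal brute-force sketch of a knn MI estimator in the spirit of this line of work is below. It follows the well-known Kraskov-Stögbauer-Grassberger (KSG) algorithm 1, not the geometric refinement the abstract proposes; the sample sizes and noise levels are illustrative, and the O(n^2) neighbour search is only adequate for small samples:

```python
import math
import random

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def psi(m):
    """Digamma at a positive integer: psi(m) = -gamma + sum_{j<m} 1/j."""
    return -GAMMA + sum(1.0 / j for j in range(1, m))

def ksg_mutual_information(xs, ys, k=3):
    """KSG (algorithm 1) mutual information estimate in nats."""
    n = len(xs)
    est = psi(k) + psi(n)
    for i in range(n):
        # max-norm distance to the k-th nearest neighbour of point i
        dists = sorted(max(abs(xs[i] - xs[j]), abs(ys[i] - ys[j]))
                       for j in range(n) if j != i)
        eps = dists[k - 1]
        nx = sum(1 for j in range(n) if j != i and abs(xs[i] - xs[j]) < eps)
        ny = sum(1 for j in range(n) if j != i and abs(ys[i] - ys[j]) < eps)
        est -= (psi(nx + 1) + psi(ny + 1)) / n
    return est

random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(400)]
y = [xi + 0.5 * random.gauss(0.0, 1.0) for xi in x]   # dependent on x
z = [random.gauss(0.0, 1.0) for _ in range(400)]      # independent of x
mi_xy, mi_xz = ksg_mutual_information(x, y), ksg_mutual_information(x, z)
print(mi_xy, mi_xz)  # the dependent pair scores far higher
```

The max-norm balls here are the "geometrically regular local volume elements" the abstract mentions; the proposed geometric variant replaces them with data-adapted regions.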


Efficient Entropy Estimation for Mutual Information Analysis Using B-Splines

Correlation Power Analysis (CPA) is probably the most widely used side-channel attack because it seems to fit the power model of most standard CMOS devices and is computed very efficiently. However, the Pearson correlation coefficient used in CPA measures only linear statistical dependences, whereas Mutual Information (MI) takes into account both linear and nonlinear dependences. Even if ther...
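The contrast the abstract draws can be demonstrated on synthetic data: for a purely quadratic dependence the Pearson coefficient is near zero, while a simple histogram MI estimate is clearly positive. This is an illustrative sketch only, not a side-channel attack implementation, and the bin count and sample are made-up choices:

```python
import math
import random
from collections import Counter

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

def histogram_mi(xs, ys, bins=10):
    """Plug-in MI estimate in bits from an equal-width 2-D histogram."""
    lx, wx = min(xs), (max(xs) - min(xs)) / bins
    ly, wy = min(ys), (max(ys) - min(ys)) / bins

    def to_bin(v, lo, w):
        return min(int((v - lo) / w), bins - 1)

    n = len(xs)
    joint = Counter((to_bin(a, lx, wx), to_bin(b, ly, wy))
                    for a, b in zip(xs, ys))
    px, py = Counter(), Counter()
    for (i, k), c in joint.items():
        px[i] += c
        py[k] += c
    return sum((c / n) * math.log2(c * n / (px[i] * py[k]))
               for (i, k), c in joint.items())

random.seed(2)
x = [random.uniform(-1.0, 1.0) for _ in range(5000)]
y = [xi * xi for xi in x]   # purely nonlinear (quadratic) dependence
print(pearson(x, y), histogram_mi(x, y))
```

The B-spline approach the abstract concerns replaces the hard histogram bins above with smooth, overlapping bin-membership weights, which reduces the discretization artifacts of this naive estimate.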


Information Theory 4.1 Entropy and Mutual Information

Neural encoding and decoding focus on the question: "What does the response of a neuron tell us about a stimulus?" In this chapter we consider a related but different question: "How much does the neural response tell us about a stimulus?" The techniques of information theory allow us to answer this question in a quantitative manner. Furthermore, we can use them to ask what forms of neural r...
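The quantitative question can be made concrete with the textbook identity I(S;R) = H(S) + H(R) - H(S,R). The joint distribution below is a made-up two-stimulus, two-response example, not data from the chapter:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(S;R) = H(S) + H(R) - H(S,R) for a row-major joint pmf."""
    ps = [sum(row) for row in joint]           # stimulus marginal
    pr = [sum(col) for col in zip(*joint)]     # response marginal
    pj = [p for row in joint for p in row]     # flattened joint
    return entropy(ps) + entropy(pr) - entropy(pj)

# Two stimuli, two responses; the response matches the stimulus 90% of the time.
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(mutual_information(joint))  # about 0.531 bits
```

A noiseless response would carry the full 1 bit of the stimulus; the 10% confusion rate reduces the information to roughly half a bit.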



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2012

ISSN: 0018-9448,1557-9654

DOI: 10.1109/tit.2011.2172572